On Convergence of Kernel Learning Estimators
Authors
Abstract
The paper studies kernel regression learning from the standpoints of stochastic optimization and ill-posedness. Specifically, we investigate the convergence properties of kernel learning estimators as the regularization parameter is gradually driven to zero while the number of observations grows. We derive computable non-asymptotic bounds on the deviation of the expected risk from its best possible value and obtain an optimal value of the regularization parameter that minimizes these bounds. We also establish conditions for almost sure convergence of the function estimates, together with a rule for decreasing the regularization factor as the sample size increases.
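As a concrete illustration (not the paper's algorithm), the sketch below fits a regularized kernel least-squares estimator while shrinking the regularization parameter with the sample size. The Gaussian kernel and the schedule lam_n = n^(-1/2) are illustrative assumptions, not the optimal rate derived in the paper.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_kernel_estimator(X, y, lam):
    """Regularized kernel least squares: solve (K + n*lam*I) a = y."""
    n = len(y)
    K = rbf_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

rng = np.random.default_rng(0)
for n in (50, 200, 800):
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)
    lam = n ** -0.5                 # assumed decay schedule lam_n = n^(-1/2)
    a = fit_kernel_estimator(X, y, lam)
    grid = np.linspace(-1.0, 1.0, 200)[:, None]
    f_hat = rbf_kernel(grid, X) @ a  # estimate evaluated on a grid
```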
Similar resources
Almost Sure Convergence of Kernel Bivariate Distribution Function Estimator under Negative Association
Let $\{X_n,\ n \ge 1\}$ be a strictly stationary sequence of negatively associated random variables with common distribution function F. In this paper, we consider the estimation of the two-dimensional distribution function of $(X_1, X_{k+1})$ for fixed $k \in \mathbb{N}$ based on kernel-type estimators. We establish asymptotic normality and study moment properties. From these we derive the optimal bandwidth...
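For intuition, a minimal kernel-type estimator of the bivariate distribution function of $(X_1, X_{k+1})$ can be sketched as below, using the integrated Gaussian kernel; the smoothing choice is an assumption, and the negative-association setting of the paper plays no role in the code.

```python
import numpy as np
from scipy.stats import norm

def bivariate_cdf_estimate(x, y, sample, k, h):
    """Kernel-type estimate of F(x, y) = P(X_1 <= x, X_{k+1} <= y)
    from a stationary sequence, via the integrated Gaussian kernel."""
    s = np.asarray(sample, dtype=float)
    lead, lag = s[:-k], s[k:]               # the (X_i, X_{i+k}) pairs
    # Smooths the empirical indicator 1{X_i <= x, X_{i+k} <= y}.
    return float(np.mean(norm.cdf((x - lead) / h) * norm.cdf((y - lag) / h)))
```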
Ensemble weighted kernel estimators for multivariate entropy estimation
The problem of estimating entropy functionals of probability densities has received much attention in the information theory, machine learning, and statistics communities. Kernel density plug-in estimators are simple, easy to implement, and widely used for entropy estimation. However, for large feature dimension d, kernel plug-in estimators suffer from the curse of dimensionality: the MSE r...
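The plug-in baseline this snippet refers to can be sketched as a leave-one-out Gaussian kernel resubstitution estimator; the bandwidth h and the kernel are assumptions here, and the paper's ensemble weighting is not reproduced.

```python
import numpy as np

def entropy_plugin(X, h):
    """Kernel density plug-in (resubstitution) estimate of differential
    entropy H(f) = -E[log f(X)] for an (n, d) sample X."""
    n, d = X.shape
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-sq / (2.0 * h ** 2)) / ((2.0 * np.pi) ** (d / 2) * h ** d)
    # Leave-one-out density estimate at each sample point avoids the
    # self-contribution K(0), which would bias log f_hat upward.
    f_hat = (K.sum(axis=1) - np.diag(K)) / (n - 1)
    return -np.mean(np.log(f_hat))
```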
The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel
One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation usual kernel (GEUK), a bias reduction kernel (BRK), and a geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...
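One standard geometric-extrapolation construction (the Terrell-Scott form, assumed here; the paper's GEUK/GEBRK variants may differ) combines two bandwidths so that the leading bias terms cancel:

```python
import numpy as np

def kde(x, data, h):
    """Usual Gaussian kernel density estimate (UK) on points x."""
    u = (np.asarray(x)[:, None] - np.asarray(data)[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

def kde_geometric(x, data, h):
    """Geometric extrapolation of two bandwidths (assumed Terrell-Scott
    form): the exponents 4/3 and -1/3 cancel the leading O(h^2) bias
    term of the usual estimator."""
    return kde(x, data, h) ** (4.0 / 3.0) * kde(x, data, 2.0 * h) ** (-1.0 / 3.0)
```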
Comparison of the Gamma kernel and the orthogonal series methods of density estimation
The standard kernel density estimator suffers from boundary bias when estimating probability density functions of distributions on the positive real line. The Gamma kernel estimators and orthogonal series estimators are two alternatives that are free of boundary bias. In this paper, a simulation study is conducted to compare the small-sample performance of the Gamma kernel estimators and the orthog...
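A Chen-type Gamma kernel estimator (the form assumed here) can be sketched as follows; placing a Gamma-density kernel at each evaluation point keeps all mass on the positive half-line, which is how boundary bias at zero is avoided:

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_kde(x, data, b):
    """Gamma kernel density estimate (assumed Chen-type form) for
    data supported on [0, inf), with smoothing parameter b."""
    # Each evaluation point x carries a Gamma(shape = x/b + 1, scale = b)
    # kernel, so no kernel mass leaks below zero.
    x = np.atleast_1d(np.asarray(x, dtype=float))
    return np.array([gamma.pdf(data, a=xi / b + 1.0, scale=b).mean() for xi in x])
```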
Uniformly Root-n Consistent Density Estimators for Weakly Dependent Invertible Linear Processes
Convergence rates of kernel density estimators for stationary time series are well studied. For invertible linear processes, we construct a new density estimator that converges, in the supremum norm, at the better, parametric, rate $n^{-1/2}$. Our estimator is a convolution of two different residual-based kernel estimators. We obtain in particular convergence rates for such residual-based kernel estimat...
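The convolution structure mentioned in the snippet can be sketched numerically; the code below convolves two generic residual-based kernel density estimates on a uniform grid and illustrates only the structure, not the paper's specific construction.

```python
import numpy as np

def kde_on_grid(residuals, grid, h):
    """Gaussian KDE of a residual sample, evaluated on a uniform grid."""
    u = (grid[:, None] - np.asarray(residuals)[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

def convolution_estimator(res_a, res_b, grid, h):
    """Numerical convolution of two residual-based KDEs on the grid;
    the grid spacing scales the discrete convolution into a density."""
    dx = grid[1] - grid[0]
    f_a = kde_on_grid(res_a, grid, h)
    f_b = kde_on_grid(res_b, grid, h)
    return np.convolve(f_a, f_b, mode="same") * dx
```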
Journal: SIAM Journal on Optimization
Volume 20, Issue -
Pages -
Publication date: 2009